Bridging Source and Target Word Embeddings for Neural Machine Translation

Authors

  • Shaohui Kuang
  • Junhui Li
  • Deyi Xiong
Abstract

Neural machine translation systems encode a source sequence into a vector from which a target sequence is generated via a decoder. Unlike in traditional statistical machine translation, source and target words are not directly mapped to each other through translation rules: they sit at the two ends of a long information channel in the encoder-decoder neural network, separated by source and target hidden states. This may lead to translations with implausible word alignments. In this paper, we try to bridge source and target word embeddings so as to shorten the distance between them. We propose three bridging strategies: 1) a source state bridging model that moves source word embeddings one step closer to their target counterparts, 2) a target state bridging model that explores relevant source word embeddings for target state prediction, and 3) a direct link bridging model that directly connects source and target word embeddings so as to minimize their discrepancy. Experiments and analysis demonstrate that the proposed bridging models significantly improve the quality of both translations and word alignments.
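
The three strategies can be made concrete with a small sketch. Below is a minimal PyTorch illustration of the third strategy, the direct link bridging model: an attention-weighted source word embedding is linearly mapped into the target embedding space, and its squared distance to the target word embedding is added to the training loss as a penalty. The module name `DirectLinkBridge`, the transformation `W_b`, and the squared-distance penalty are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

class DirectLinkBridge(nn.Module):
    """Sketch of a direct link bridge: penalize the distance between a
    target word embedding and a linearly transformed, attention-weighted
    source word embedding. Names and the exact penalty are illustrative."""

    def __init__(self, src_emb_dim: int, tgt_emb_dim: int):
        super().__init__()
        # Linear map from the source to the target embedding space.
        self.W_b = nn.Linear(src_emb_dim, tgt_emb_dim, bias=False)

    def forward(self, src_embs, attn_weights, tgt_emb):
        # src_embs:     (batch, src_len, src_emb_dim) source word embeddings
        # attn_weights: (batch, src_len) attention over source positions
        # tgt_emb:      (batch, tgt_emb_dim) embedding of the target word
        context = torch.bmm(attn_weights.unsqueeze(1), src_embs).squeeze(1)
        bridged = self.W_b(context)  # map into the target embedding space
        # Squared Euclidean distance, averaged over the batch.
        return ((bridged - tgt_emb) ** 2).sum(dim=-1).mean()
```

In training, such a term would typically be weighted and added to the usual negative log-likelihood, e.g. `loss = nll + lambda_bridge * bridge_loss`, so that the decoder is encouraged to produce target words whose embeddings stay close to the attended source words.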

Similar Resources

Embedding Word Similarity with Neural Machine Translation

Neural language models learn word representations, or embeddings, that capture rich linguistic and conceptual information. Here we investigate the embeddings learned by neural machine translation models, a recently developed class of neural language models. We show that embeddings from translation models outperform those learned by monolingual models at tasks that require knowledge of both conce...


Not All Neural Embeddings are Born Equal

Neural language models learn word representations that capture rich linguistic and conceptual information. Here we investigate the embeddings learned by neural machine translation models. We show that translation-based embeddings outperform those learned by cutting-edge monolingual models at single-language tasks requiring knowledge of conceptual similarity and/or syntactic role. The findings s...


Convolutional Encoders for Neural Machine Translation

We propose a general Convolutional Neural Network (CNN) encoder model for machine translation that fits within the framework of Encoder-Decoder models proposed by Cho et al. [1]. A CNN takes as input a sentence in the source language, performs multiple convolution and pooling operations, and uses a fully connected layer to produce a fixed-length encoding of the sentence as input to a Recur...
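
As a rough illustration of this architecture, the following PyTorch sketch encodes a token sequence with a single convolution-and-pooling stage followed by a fully connected layer. The dimensions and the single-stage design are assumptions for brevity; the actual model stacks multiple convolution and pooling operations.

```python
import torch
import torch.nn as nn

class ConvSentenceEncoder(nn.Module):
    """Encode a source sentence into a fixed-length vector via embedding,
    1D convolution, max-pooling over time, and a fully connected layer."""

    def __init__(self, vocab_size=10000, emb_dim=256,
                 n_filters=512, kernel_size=3, enc_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size, padding=1)
        self.fc = nn.Linear(n_filters, enc_dim)

    def forward(self, tokens):                    # tokens: (batch, src_len)
        x = self.embed(tokens).transpose(1, 2)    # (batch, emb_dim, src_len)
        x = torch.relu(self.conv(x))              # (batch, n_filters, src_len)
        x = x.max(dim=2).values                   # max-pool over time
        return self.fc(x)                         # fixed-length encoding
```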


Character-based Neural Machine Translation

Neural Machine Translation (MT) has achieved state-of-the-art results. However, one of the main challenges that neural MT still faces is dealing with very large vocabularies and morphologically rich languages. In this paper, we propose a neural MT system using character-based embeddings in combination with convolutional and highway layers to replace the standard lookup-based word representations...
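
A minimal PyTorch sketch of such a character-based word representation follows, assuming one convolutional layer and one highway layer; the vocabulary size, filter width, and layer counts are illustrative rather than the paper's configuration.

```python
import torch
import torch.nn as nn

class CharWordEncoder(nn.Module):
    """Compose a word representation from its characters: character
    embeddings -> 1D convolution -> max-pool -> highway layer."""

    def __init__(self, n_chars=100, char_dim=32, n_filters=256, kernel=3):
        super().__init__()
        self.char_embed = nn.Embedding(n_chars, char_dim)
        self.conv = nn.Conv1d(char_dim, n_filters, kernel, padding=1)
        self.gate = nn.Linear(n_filters, n_filters)  # highway transform gate
        self.proj = nn.Linear(n_filters, n_filters)  # highway transform

    def forward(self, chars):                     # chars: (batch, word_len)
        x = self.char_embed(chars).transpose(1, 2)
        x = torch.relu(self.conv(x)).max(dim=2).values  # (batch, n_filters)
        t = torch.sigmoid(self.gate(x))                 # gate in [0, 1]
        # Highway: mix the transformed and the untransformed features.
        return t * torch.relu(self.proj(x)) + (1 - t) * x
```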


Learning Bilingual Projections of Embeddings for Vocabulary Expansion in Machine Translation

We propose a simple log-bilinear softmax-based model to deal with vocabulary expansion in machine translation. Our model uses word embeddings trained on large unlabelled monolingual corpora and learns over a fairly small word-to-word bilingual dictionary. Given an out-of-vocabulary source word, the model generates a probabilistic list of possible translations in the target language...
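
The core idea can be sketched as a learned projection from a pretrained source embedding space into the target embedding space, scored against all target word embeddings with a softmax. The class below is a minimal PyTorch illustration; the name `BilingualProjection` and the frozen target embedding table are assumptions, not the paper's exact model.

```python
import torch
import torch.nn as nn

class BilingualProjection(nn.Module):
    """Map a pretrained source embedding into the target embedding space
    and score every target word with a softmax (log-bilinear form)."""

    def __init__(self, src_dim: int, tgt_embs: torch.Tensor):
        super().__init__()
        # tgt_embs: (V_tgt, tgt_dim) pretrained target embeddings, frozen.
        self.W = nn.Linear(src_dim, tgt_embs.size(1), bias=False)
        self.tgt_embs = nn.Parameter(tgt_embs, requires_grad=False)

    def forward(self, src_vec):                   # src_vec: (batch, src_dim)
        scores = self.W(src_vec) @ self.tgt_embs.t()  # (batch, V_tgt)
        return torch.log_softmax(scores, dim=-1)      # log p(target | source)
```

Training would fit `W` by negative log-likelihood on the small bilingual dictionary; at test time, the log-probability row for an out-of-vocabulary source word's monolingual embedding yields the probabilistic translation list.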



Journal:
  • CoRR

Volume: abs/1711.05380

Publication date: 2017